Differential entropy (also referred to as continuous entropy) is a concept in information theory that extends the idea of (Shannon) entropy, a measure of the average surprisal of a random variable, to continuous probability distributions.

==Definition==
Let ''X'' be a random variable with a probability density function ''f'' whose support is a set <math>\mathcal{X}</math>. The ''differential entropy'' ''h''(''X'') or ''h''(''f'') is defined as

:<math>h(X) = -\int_{\mathcal{X}} f(x)\log f(x)\,dx.</math>

For probability distributions which do not have an explicit density function expression, but do have an explicit quantile function expression ''Q''(''p''), ''h''(''Q'') can be defined in terms of the derivative of ''Q''(''p''), i.e. the quantile density function ''Q''&prime;(''p''), as

:<math>h(Q) = \int_0^1 \log Q'(p)\,dp.</math>

As with its discrete analog, the units of differential entropy depend on the base of the logarithm, which is usually 2 (i.e., the units are bits). See logarithmic units for logarithms taken in different bases. Related concepts such as joint differential entropy, conditional differential entropy, and relative entropy are defined in a similar fashion.

Unlike the discrete analog, differential entropy has an offset that depends on the units used to measure ''X''.〔Pages 183–184〕 For example, the differential entropy of a quantity measured in millimeters will be log(1000) more than that of the same quantity measured in meters; a dimensionless quantity will have differential entropy log(1000) more than the same quantity divided by 1000.

One must take care in trying to apply properties of discrete entropy to differential entropy, since probability density functions can be greater than 1. For example, Uniform(0, 1/2) has ''negative'' differential entropy:

:<math>h(X) = -\int_0^{1/2} 2\log 2\,dx = -\log 2.</math>

Thus, differential entropy does not share all properties of discrete entropy.

Note that the continuous mutual information ''I''(''X'';''Y'') has the distinction of retaining its fundamental significance as a measure of discrete information, since it is the limit of the discrete mutual information of ''partitions'' of ''X'' and ''Y'' as these partitions become finer and finer. Thus it is invariant under non-linear homeomorphisms (continuous and uniquely invertible maps), including linear transformations of ''X'' and ''Y'', and still represents the amount of discrete information that can be transmitted over a channel that admits a continuous space of values.
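To make the two equivalent definitions above concrete, the following sketch (a hypothetical illustration, not part of the original article) numerically evaluates both integrals for an exponential distribution, whose quantile function ''Q''(''p'') = &minus;ln(1 &minus; ''p'')/&lambda; has a closed form; both results should match the known value 1 &minus; ln &lambda; nats. The rate parameter value is an arbitrary choice for demonstration.

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad

lam = 2.0  # rate of an exponential distribution (arbitrary example value)

# Density-based definition: h(X) = -∫ f(x) log f(x) dx over the support (0, ∞)
f = lambda x: lam * np.exp(-lam * x)
h_density, _ = quad(lambda x: -f(x) * np.log(f(x)), 0, np.inf)

# Quantile-based definition: h(Q) = ∫_0^1 log Q'(p) dp,
# where Q(p) = -ln(1 - p)/lam, so the quantile density is Q'(p) = 1/(lam*(1 - p))
h_quantile, _ = quad(lambda p: np.log(1.0 / (lam * (1.0 - p))), 0, 1)

print(h_density, h_quantile, 1 - np.log(lam))  # all ≈ 0.3069 nats
</syntaxhighlight>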
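The two cautions above, that differential entropy can be negative and that it shifts by log of the scale factor under a change of units, can also be checked numerically. This sketch (again a hypothetical illustration) uses SciPy's <code>entropy</code> method for continuous distributions, which reports differential entropy in nats:

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import uniform, norm

# Uniform(0, 1/2): the density equals 2 on its support, so h = -log 2 < 0.
h_unif = uniform(loc=0, scale=0.5).entropy()
print(h_unif, -np.log(2))  # both ≈ -0.6931 nats

# Unit-offset property: rescaling X by 1000 (e.g. meters -> millimeters)
# shifts the differential entropy by +log(1000).
h_m = norm(scale=1.0).entropy()      # a Gaussian quantity "in meters"
h_mm = norm(scale=1000.0).entropy()  # the same quantity "in millimeters"
print(h_mm - h_m, np.log(1000))      # both ≈ 6.9078 nats
</syntaxhighlight>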